Fluctuation-response theorem for Kullback-Leibler divergences to quantify causation

Authors

Abstract

We define a new measure of causation from a fluctuation-response theorem for Kullback-Leibler divergences, based on the information-theoretic cost of perturbations. This information response has both the invariance properties required for an information-theoretic measure and the physical interpretation of a propagation of perturbations. In linear systems, the information response reduces to the transfer entropy, providing a connection between Fisher and mutual information.
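
For orientation, two standard identities sit behind these claims (quoted here as textbook background, not as the paper's own definitions): the KL cost of a small parameter perturbation is quadratic, with the Fisher information as its coefficient, and the transfer entropy from X to Y is a conditional mutual information.

\[
D_{\mathrm{KL}}\left(p_\theta \,\|\, p_{\theta+\delta\theta}\right)
  = \tfrac{1}{2}\,\mathcal{F}(\theta)\,\delta\theta^{2} + O(\delta\theta^{3}),
\qquad
\mathcal{F}(\theta) = \mathbb{E}_{p_\theta}\!\left[\left(\partial_\theta \log p_\theta(X)\right)^{2}\right],
\]
\[
\mathcal{T}_{X \to Y} = I\!\left(Y_{t+1};\, X_t \mid Y_t\right)
  = \mathbb{E}\!\left[\log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)}\right].
\]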


Related articles

Entropy and Kullback-Leibler divergence estimation based on Szegö's theorem

In this work, a new technique for the estimation of Shannon's entropy and the Kullback-Leibler (KL) divergence for one-dimensional data is presented. The estimator is based on Szegö's theorem for sequences of Toeplitz matrices, which deals with the asymptotic behavior of the eigenvalues of those matrices, and the analogy between a probability density function (PDF) and a power spectral ...
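
As a hedged sketch of how such an estimator can be wired up (the function name szego_entropy, the truncation order n, and the scaling choices below are mine, not the paper's): estimate the Fourier coefficients of the PDF from the empirical characteristic function, build the Hermitian Toeplitz matrix they generate, and apply Szegö's eigenvalue theorem with g(t) = -t log t to read off the entropy.

import numpy as np
from scipy.linalg import toeplitz

def szego_entropy(x, n=64):
    # Map the sample onto [0, 2*pi); 'scale' is the Jacobian of that map.
    a, b = float(x.min()), float(x.max())
    scale = (b - a) / (2 * np.pi)
    theta = (x - a) / scale
    # Fourier coefficients of the PDF seen as a Toeplitz symbol:
    # c_k = (1/2pi) E[exp(-i k Theta)], estimated by the empirical
    # characteristic function (the PDF <-> power-spectrum analogy).
    k = np.arange(n)
    c = np.exp(-1j * np.outer(k, theta)).mean(axis=1) / (2 * np.pi)
    T = toeplitz(c)                 # Hermitian Toeplitz matrix generated by c
    lam = np.linalg.eigvalsh(T)
    lam = lam[lam > 1e-12]          # drop numerically non-positive eigenvalues
    # Szegö: (1/n) sum g(lam_i) -> (1/2pi) integral g(f(t)) dt, g(t) = -t*log(t).
    h_scaled = (2 * np.pi / n) * np.sum(-lam * np.log(lam))
    return h_scaled + np.log(scale) # undo the change of variables back to x

# Example: standard normal data; true differential entropy is
# 0.5*log(2*pi*e) ≈ 1.4189 nats.
rng = np.random.default_rng(0)
print(szego_entropy(rng.standard_normal(100_000)))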


Kullback-Leibler Divergence and the Central Limit Theorem

This paper investigates the asymptotics of the Kullback-Leibler divergence between two probability distributions satisfying a Central Limit Theorem property. The basic problem is as follows. Let X_i, i ∈ ℕ, be a sequence of independent random variables such that the sum S_n = Σ_{i=1}^{n} X_i has the same expected value and satisfies the CLT under each probability distribution. Then what are the asymptotics...
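
A toy calculation makes the setup concrete, under the assumption that both laws give S_n mean zero and that the Gaussian CLT approximation is substituted for the true distributions (the paper studies the true divergence, which need not match this proxy): the divergence between the two Gaussian approximations is then constant in n, set entirely by the variance mismatch.

import numpy as np

def kl_gauss(m1, s1, m2, s2):
    # Closed-form KL divergence D(N(m1, s1^2) || N(m2, s2^2)).
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# Two laws for the same i.i.d. summands: equal mean, different variance.
sig_p, sig_q = 1.0, 1.3
for n in [10, 100, 1000, 10000]:
    # CLT approximation: S_n ≈ N(0, n * sigma^2) under each law.
    print(n, kl_gauss(0.0, np.sqrt(n) * sig_p, 0.0, np.sqrt(n) * sig_q))
# The printed value does not grow with n: the Gaussian-proxy divergence
# saturates at a level determined by the variance mismatch alone.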


Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem, introduced by Srivastava [Sr]. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...


Kullback-Leibler Boosting

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler d...
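
The greedy step can be sketched as follows, with the caveat that the paper's actual feature optimization is not random search; this illustration replaces it with scoring random unit directions so that "maximizing the projected KL divergence between class histograms" is concrete. All function names here are hypothetical.

import numpy as np

def projected_kl(pos, neg, w, bins=32):
    # KL divergence between 1-D histograms of the two classes projected onto w.
    zp, zn = pos @ w, neg @ w
    edges = np.linspace(min(zp.min(), zn.min()), max(zp.max(), zn.max()), bins + 1)
    p, _ = np.histogram(zp, bins=edges)
    q, _ = np.histogram(zn, bins=edges)
    p = (p + 1e-3) / (p + 1e-3).sum()   # smooth to avoid log(0)
    q = (q + 1e-3) / (q + 1e-3).sum()
    return np.sum(p * np.log(p / q))

def best_kl_feature(pos, neg, n_candidates=500, seed=0):
    # Toy stand-in for the learning step: keep the direction with maximal
    # projected KL divergence among random unit vectors.
    rng = np.random.default_rng(seed)
    best_w, best_kl = None, -np.inf
    for _ in range(n_candidates):
        w = rng.standard_normal(pos.shape[1])
        w /= np.linalg.norm(w)
        kl = projected_kl(pos, neg, w)
        if kl > best_kl:
            best_w, best_kl = w, kl
    return best_w, best_kl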


Dialect Distance Assessment Based on 2-dimensional Pitch Slope Features and Kullback-Leibler Divergences

Dialect variations of a language have a severe impact on the performance of speech systems. Therefore, knowing how close or separate dialects are in a given language space provides useful information to predict, or improve, system performance when there is a mismatch between train and test data. Distance measures have been used in several applications of speech processing, including speech reco...
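
The abstract does not spell out the model, but a common recipe for such KL-based distances, offered here only as an assumed sketch, is to fit one Gaussian per dialect to the 2-D pitch-slope features and use the symmetrized closed-form KL between the fitted Gaussians as the distance.

import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    # Closed-form KL divergence D(N(mu0, S0) || N(mu1, S1)).
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def dialect_distance(feats_a, feats_b):
    # Symmetrized KL between Gaussians fitted to each dialect's 2-D features.
    mu_a, S_a = feats_a.mean(axis=0), np.cov(feats_a, rowvar=False)
    mu_b, S_b = feats_b.mean(axis=0), np.cov(feats_b, rowvar=False)
    return 0.5 * (kl_mvn(mu_a, S_a, mu_b, S_b) + kl_mvn(mu_b, S_b, mu_a, S_a))

# Example with synthetic "pitch slope" features for two dialects.
rng = np.random.default_rng(1)
a = rng.multivariate_normal([0.0, 0.1], [[1.0, 0.2], [0.2, 0.5]], size=2000)
b = rng.multivariate_normal([0.3, -0.1], [[1.2, 0.1], [0.1, 0.6]], size=2000)
print(dialect_distance(a, b))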



Journal

Journal: EPL

Year: 2021

ISSN: 0295-5075, 1286-4854

DOI: https://doi.org/10.1209/0295-5075/135/28002